117 research outputs found

    Personal Volunteer Computing

    Full text link
We propose personal volunteer computing, a novel paradigm to encourage technical solutions that leverage personal devices, such as smartphones and laptops, for personal applications that require significant computation, such as animation rendering and image processing. The paradigm requires no investment in additional hardware, relying instead on devices already owned by users and their community, and favours simple tools that can be implemented part-time by a single developer. We show that a sample of today's personal devices is competitive with a top-of-the-line laptop from two years ago. We also propose new directions to extend the paradigm.
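The device-pooling idea above can be sketched as a toy task farm. This is an illustration only, not the paper's tool: each "device" is a worker thread standing in for a networked personal device, and the frame-rendering function is a hypothetical placeholder for an expensive computation.

```python
from concurrent.futures import ThreadPoolExecutor

def render_frame(frame_id):
    # Stand-in for an expensive per-frame computation
    # (e.g. one frame of an animation render).
    return frame_id * frame_id

def render_batch(frame_ids, n_devices=4):
    # Farm independent tasks out to a pool of "devices";
    # a real deployment would dispatch over the local network.
    with ThreadPoolExecutor(max_workers=n_devices) as pool:
        return list(pool.map(render_frame, frame_ids))

results = render_batch(range(8))
```

Because the tasks are independent, the same pattern scales from threads on one laptop to a handful of phones and laptops owned by one household.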

    The Grizzly, February 9, 1979

    Get PDF
    False Alarm Leads To Student Arrest • Annual Report Reveals Enrollment Decline • SFARC Disbandment Questioned • ID Crackdown • USGA Election Results • Career Counseling & Placement Services • Letters to the Editor: Snack shop; Zeta Chi; Food waste angers waitress; Theft precautions cited • Roving Reporter: Forums requirement • Ursinus News In Brief: Soviet relations; Basses needed • An Inside View of Alice Cooper • Audio Corner: Turntables • Al Stewart: England\u27s Answer to Bob Dylan • Sheer Energy • Sport Book Review • New Semester; New Offerings • Grapplers Take Two • Bruins Split • Indoor Bears Off and Running • Men\u27s Swim Goes Under • Gymnasts Revenge Pennhttps://digitalcommons.ursinus.edu/grizzlynews/1012/thumbnail.jp

    SeaFlow data v1, high-resolution abundance, size and biomass of small phytoplankton in the North Pacific

    Get PDF
SeaFlow is an underway flow cytometer that provides continuous shipboard observations of the abundance and optical properties of small phytoplankton (<5 µm in equivalent spherical diameter, ESD). Here we present data sets consisting of SeaFlow-based cell abundance, forward light scatter, and pigment fluorescence of individual cells, as well as derived estimates of ESD and cellular carbon content of picophytoplankton, which includes the cyanobacteria Prochlorococcus, Synechococcus and small-sized Crocosphaera (<5 µm ESD), and picophytoplankton and nanophytoplankton (2–5 µm ESD). Data were collected in surface waters (approximately 5 m depth) from 27 oceanographic cruises carried out in the Northeast Pacific Ocean between 2010 and 2018. Thirteen cruises provide high-spatial-resolution (approximately 1 km) measurements across 32,500 km of the Northeast Pacific Ocean, and 14 near-monthly cruises beginning in 2015 provide seasonal distributions at the long-term sampling site (Station ALOHA) of the Hawaii Ocean Time-Series. These data sets expand our knowledge of the current spatial and temporal distributions of picophytoplankton in the surface ocean.
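The derivation chain from ESD to cellular carbon can be sketched as follows. The sphere-volume step is standard geometry; the carbon step uses a generic power-law C = a·V^b whose coefficients here are placeholders, not the values used to produce the SeaFlow data set.

```python
import math

def biovolume_um3(esd_um):
    # Biovolume of a sphere with diameter esd_um: V = (pi/6) * d^3.
    return math.pi / 6.0 * esd_um ** 3

def carbon_fg(esd_um, a=0.26, b=0.86):
    # Generic carbon-from-volume power law C = a * V**b.
    # The coefficients a and b are illustrative placeholders only.
    return a * biovolume_um3(esd_um) ** b
```

The exponent b < 1 encodes the common observation that carbon density decreases with cell size; any real use would substitute calibrated coefficients.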

    BigDL: A Distributed Deep Learning Framework for Big Data

    Full text link
This paper presents BigDL, a distributed deep learning framework for Apache Spark, which has been used by a variety of users in industry to build deep learning applications on production big data platforms. It allows deep learning applications to run on an Apache Hadoop/Spark cluster so as to directly process the production data, as part of an end-to-end data analysis pipeline for deployment and management. Unlike existing deep learning frameworks, BigDL implements distributed, data-parallel training directly on top of the functional compute model (with copy-on-write and coarse-grained operations) of Spark. We also share real-world experience and "war stories" of users that have adopted BigDL to address their challenges (i.e., how to easily build end-to-end data analysis and deep learning pipelines for their production data).
    Comment: In ACM Symposium on Cloud Computing (SoCC) 201
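The data-parallel pattern the abstract refers to can be sketched in a few lines. This is not BigDL's API: plain Python lists stand in for Spark partitions, and a 1-D linear model stands in for a neural network. Each shard computes a gradient locally, the gradients are averaged (the reduce step), and one parameter update is broadcast back.

```python
def grad(w, shard):
    # Gradient of mean squared error for the 1-D linear model y = w * x,
    # computed on one data shard (one "partition").
    return sum(2 * (w * x - y) * x for x, y in shard) / len(shard)

def train_step(w, shards, lr=0.01):
    # Synchronous data-parallel step: average per-shard gradients,
    # then apply a single global update.
    g = sum(grad(w, s) for s in shards) / len(shards)
    return w - lr * g

# Synthetic data with true slope 3.0, split across two shards.
data = [(x, 3.0 * x) for x in range(1, 9)]
shards = [data[:4], data[4:]]
w = 0.0
for _ in range(200):
    w = train_step(w, shards)
# w converges toward the true slope 3.0
```

The averaging step is exactly what a coarse-grained reduce over partitions would perform on a cluster; only the transport differs.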

    Critical analysis of vendor lock-in and its impact on cloud computing migration: a business perspective

    Get PDF
Vendor lock-in is a major barrier to the adoption of cloud computing, due to the lack of standardization. Current solutions and efforts tackling the vendor lock-in problem are predominantly technology-oriented, and limited studies exist that analyse and highlight the complexity of the vendor lock-in problem in the cloud environment. Consequently, most customers are unaware of the proprietary standards which inhibit interoperability and portability of applications when taking services from vendors. This paper provides a critical analysis of the vendor lock-in problem from a business perspective. A survey based on qualitative and quantitative approaches identified the main risk factors that give rise to lock-in situations. The analysis of our survey of 114 participants shows that, as computing resources migrate from on-premise to the cloud, the vendor lock-in problem is exacerbated. Furthermore, the findings exemplify the importance of interoperability, portability and standards in cloud computing. A number of strategies are proposed for avoiding and mitigating lock-in risks when migrating to cloud computing. The strategies relate to contracts, selection of vendors that support standardised formats and protocols for standard data structures and APIs, and developing awareness of commonalities and dependencies among cloud-based solutions. We strongly believe that implementing these strategies has great potential to reduce the risks of vendor lock-in.
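One of the mitigation strategies above — coding against standard interfaces rather than a vendor SDK — can be sketched as a small provider-agnostic abstraction layer. The class names here are hypothetical, not from the paper or any vendor's API.

```python
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    # Minimal provider-agnostic storage interface; application code
    # depends only on this, never on a concrete vendor client.
    @abstractmethod
    def put(self, key, data): ...
    @abstractmethod
    def get(self, key): ...

class InMemoryStore(ObjectStore):
    # Stand-in backend; a real port would wrap one vendor's client here.
    def __init__(self):
        self._data = {}
    def put(self, key, data):
        self._data[key] = data
    def get(self, key):
        return self._data[key]

def archive(store: ObjectStore, key, data):
    # Application logic sees only the interface, so the backend
    # can be swapped without touching this code.
    store.put(key, data)
    return store.get(key)
```

Migrating to a new provider then means writing one new `ObjectStore` subclass, not rewriting the application.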

    Viruses affect picocyanobacterial abundance and biogeography in the North Pacific Ocean

    Get PDF
The photosynthetic picocyanobacteria Prochlorococcus and Synechococcus are models for dissecting how ecological niches are defined by environmental conditions, but how interactions with bacteriophages affect picocyanobacterial biogeography in open ocean biomes has rarely been assessed. We applied single-virus and single-cell infection approaches to quantify cyanophage abundance and infected picocyanobacteria in 87 surface water samples from five transects that traversed approximately 2,200 km in the North Pacific Ocean on three cruises, with a duration of 2–4 weeks, between 2015 and 2017. We detected a 550-km-wide hotspot of cyanophages and virus-infected picocyanobacteria in the transition zone between the North Pacific Subtropical and Subpolar gyres that was present in each transect. Notably, the hotspot occurred at a consistent temperature and displayed distinct cyanophage-lineage composition on all transects. On two of these transects, the levels of infection in the hotspot were estimated to be sufficient to substantially limit the geographical range of Prochlorococcus. Coincident with the detection of high levels of virally infected picocyanobacteria, we measured an increase of 10–100-fold in the Synechococcus populations in samples that are usually dominated by Prochlorococcus. We developed a multiple regression model of cyanophages, temperature and chlorophyll concentrations that inferred that the hotspot extended across the North Pacific Ocean, creating a biological boundary between gyres, with the potential to release organic matter comparable to that of the sevenfold-larger North Pacific Subtropical Gyre. Our results highlight the probable impact of viruses on large-scale phytoplankton biogeography and biogeochemistry in distinct regions of the oceans.
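The modelling step described above — a multiple regression of cyanophage abundance on temperature and chlorophyll — can be illustrated with an ordinary least-squares fit via the normal equations. The data below are synthetic and the coefficients are invented; nothing here reproduces the cruise measurements.

```python
def solve(A, b):
    # Gaussian elimination with partial pivoting for a small dense system.
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_ols(rows, y):
    # Solve the normal equations (X^T X) beta = X^T y,
    # with an intercept column prepended to the predictors.
    X = [[1.0] + list(r) for r in rows]
    k = len(X[0])
    XtX = [[sum(X[i][a] * X[i][c] for i in range(len(X))) for c in range(k)]
           for a in range(k)]
    Xty = [sum(X[i][a] * y[i] for i in range(len(X))) for a in range(k)]
    return solve(XtX, Xty)

# Synthetic example: abundance = 2 + 0.5 * temperature + 3 * chlorophyll.
temps = [10, 12, 14, 16, 18]
chl = [0.1, 0.3, 0.2, 0.4, 0.5]
abundance = [2 + 0.5 * t + 3 * c for t, c in zip(temps, chl)]
coeffs = fit_ols(list(zip(temps, chl)), abundance)
```

Because the synthetic response is exactly linear in the predictors, the fit recovers the intercept and both slopes; real cruise data would of course carry residual scatter.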

    A Digital Repository and Execution Platform for Interactive Scholarly Publications in Neuroscience

    Get PDF
The CARMEN Virtual Laboratory (VL) is a cloud-based platform which allows neuroscientists to store, share, develop, execute, reproduce and publicise their work. This paper describes new functionality in the CARMEN VL: an interactive publications repository. This new facility allows users to link data and software to publications, enabling other users to examine the data and software associated with a publication and to execute the associated software within the VL using the same data the authors used. The cloud-based architecture and SaaS (Software as a Service) framework allow vast data sets to be uploaded and analysed using software services. Thus, this new interactive publications facility allows others to build on research results through reuse. This aligns with recent developments by funding agencies, institutions, and publishers in the move to open access research. Open access provides reproducibility and verification of research resources and results. Publications and their associated data and software will be assured of long-term preservation and curation in the repository. Further, analysing research data and the evaluations described in publications frequently requires a number of execution stages, many of which are iterative. The VL provides a scientific workflow environment to combine software services into a processing tree. These workflows can also be associated with publications and executed by users. The VL also provides a secure environment where users can decide the access rights for each resource to ensure copyright and privacy restrictions are met.
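The "software services combined into a processing tree" idea can be sketched generically: each node of the tree names a service and lists its child nodes, and evaluating a node applies the service to its children's outputs. The service names and functions below are invented for illustration, not CARMEN VL services.

```python
def run(node, services):
    # Recursively evaluate a processing tree: a node is
    # (service_name, [child_nodes]); children run first.
    name, children = node
    inputs = [run(child, services) for child in children]
    return services[name](inputs)

# Hypothetical services: a data source, a transform, and an aggregator.
services = {
    "load": lambda _: [1, 2, 3, 4],
    "square": lambda ins: [x * x for x in ins[0]],
    "total": lambda ins: sum(ins[0]),
}

# total(square(load())) as a processing tree.
tree = ("total", [("square", [("load", [])])])
result = run(tree, services)
```

Associating such a tree with a publication then lets a reader re-execute the full analysis, not just the final service.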

    Social Internet of Things and New Generation Computing -- A Survey

    Full text link
Social Internet of Things (SIoT) tries to overcome challenges of the Internet of Things (IoT), such as scalability, trust and discovery of resources, by drawing inspiration from social computing. This survey investigates the research done on SIoT from two perspectives: application domains and integration with new computing models. To this end, a two-dimensional framework is proposed and existing projects are investigated accordingly. The first dimension considers and classifies available research from the application-domain perspective, and the second does the same from the standpoint of integration with new computing models. The aim is to technically describe SIoT, to classify related research, to foster the dissemination of the state of the art, and to discuss open research directions in this field.
    Comment: IoT, Social computing, Survey